
    Extended truncated Tweedie-Poisson model

    It has been argued that by truncating the sample space of the negative binomial and of the inverse Gaussian-Poisson mixture models at zero, one is allowed to extend the parameter space of the model. Here that is proved to be the case for the more general three-parameter Tweedie-Poisson mixture model. It is also proved that the distributions in the extended part of the parameter space are not the zero truncation of mixed Poisson distributions and that, other than for the negative binomial, they are not mixtures of zero-truncated Poisson distributions either. Extending the parameter space improves the fit when the frequency of ones is larger and the right tail is heavier than the unextended model allows. The extended model also permits the use of standard maximum likelihood based inference tools when parameter estimates fall in the extended part of the parameter space, and hence when the m.l.e. does not exist under the unextended model. This extended truncated Tweedie-Poisson model is shown to be useful in the analysis of word and species frequency count data.

    Double truncated Poisson regression model with random effects

    Count data regression models are used for special cases where the response variable takes count values or only non-negative values. Poisson regression models are commonly used to analyze count data. A frequent problem with these models is that the observed variation is greater than expected, and mixed Poisson models are alternatives that provide a means of explaining the extra-Poisson variation. Mixed Poisson regression models have been studied extensively and are commonly used in fields such as epidemiology, medicine, genetics, economics, engineering, marketing, and the physical and social sciences. However, in many cases the analyst does not observe the entire distribution of counts; the count data are truncated, being observed only over part of the range of the response variable. In this study, we formulate a class of regression models based on a double truncated Poisson regression model with random effects. Two distributions for the random effects, Normal and Gamma, were studied through simulation, and misspecification of these distributions was addressed. Comparisons with the left truncated mixed Poisson model and the regular mixed Poisson model were presented. It was concluded that with Normal random effects, the double and left truncated mixed Poisson models provide a better fit to clustered double truncated count data than the regular mixed Poisson model, and with Gamma random effects, the double truncated mixed Poisson model provides the better fit. These models were used to analyze a Transitional Housing Facility data set.
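    A double truncated Poisson simply renormalizes the Poisson probabilities over the observed window. As a minimal sketch of that idea (a plain Python illustration, not the paper's random-effects model; function and parameter names are hypothetical):

```python
import math

def double_truncated_poisson_pmf(k, lam, a, b):
    """P(Y = k | a <= Y <= b) for Y ~ Poisson(lam), observed only on [a, b]."""
    if not (a <= k <= b):
        return 0.0
    pois = lambda j: math.exp(-lam) * lam ** j / math.factorial(j)
    # Renormalize by the probability mass falling inside the window.
    return pois(k) / sum(pois(j) for j in range(a, b + 1))
```

    Setting a = 1 recovers the familiar zero-truncated case; the paper layers cluster-level random effects on top of this kernel.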

    Design and sparing techniques to meet specified performance life

    The specified performance life technique starts with a general description of what is wanted, defines the operational needs in block-diagram form, and then defines the functional systems required. The technique is similar to a truncated reliability model, but the calculation is simplified by using a Poisson distribution approach to failure probability.
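    The Poisson approach to sparing can be sketched as follows (a generic illustration of the calculation, with hypothetical names and rates, not the report's actual procedure):

```python
import math

def prob_enough_spares(failure_rate, hours, n_spares):
    """P(failures <= n_spares) when failures ~ Poisson(failure_rate * hours)."""
    mu = failure_rate * hours  # expected number of failures over the life
    return sum(math.exp(-mu) * mu ** k / math.factorial(k)
               for k in range(n_spares + 1))

def spares_needed(failure_rate, hours, assurance):
    """Smallest spare count meeting the required assurance probability."""
    n = 0
    while prob_enough_spares(failure_rate, hours, n) < assurance:
        n += 1
    return n
```

    For example, a unit with a failure rate of 0.001 per hour over a 2000-hour life (a mean of 2 failures) needs 5 spares for 95% assurance.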

    Bayesian Models for Zero Truncated Count Data. Asian Journal of Probability and Statistics.

    It is important to fit count data with suitable models; models such as Poisson regression, Quasi-Poisson, and negative binomial, to mention but a few, have been adopted by researchers to fit zero-truncated count data in the past. In recent times, dedicated models for fitting zero-truncated count data have been developed and are considered sufficient. This study proposed Bayesian multi-level Poisson and Bayesian multi-level Geometric models, and Bayesian Markov Chain Monte Carlo generalized linear mixed models (MCMCglmms) of the zero-truncated Poisson and of Poisson regression, to fit health count data truncated at zero. Suitable model selection criteria were used to determine the preferred models for fitting zero-truncated data. Results showed that the Bayesian multi-level Poisson model outperformed the Bayesian multi-level Geometric model, and that the MCMCglmm of the zero-truncated Poisson outperformed the MCMCglmm Poisson.
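    For the simplest member of this family, the zero-truncated Poisson, the maximum likelihood estimate of the rate solves ybar = lam / (1 - exp(-lam)); a minimal fixed-point sketch of that estimate (illustrative only, not the study's Bayesian MCMC machinery):

```python
import math

def fit_zero_truncated_poisson(counts, tol=1e-10):
    """MLE of lam for a zero-truncated Poisson sample (all counts >= 1).

    The score equation reduces to ybar = lam / (1 - exp(-lam)),
    iterated here as lam <- ybar * (1 - exp(-lam)).
    """
    ybar = sum(counts) / len(counts)
    lam = ybar  # start above the root and iterate downward
    while True:
        new = ybar * (1.0 - math.exp(-lam))
        if abs(new - lam) < tol:
            return new
        lam = new
```

    Note that the estimate is always below the sample mean, since the truncation inflates the observed average.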

    Scaling Models for the Severity and Frequency of External Operational Loss Data

    According to Basel II criteria, the use of external data is indispensable to the implementation of an advanced method for calculating operational capital. This article investigates how the severity and frequency of external losses are scaled for integration with internal data. We set up an initial model designed to explain loss severity. This model takes into account firm size, location, and business lines as well as risk types. It also shows how to calculate the internal loss equivalent to an external loss, which might occur in a given bank. OLS estimation results show that the above variables have significant power in explaining the loss amount, and they are used to develop a normalization formula. A second model based on external data is developed to scale the frequency of losses over a given period. Two regression models are analyzed: the truncated Poisson model and the truncated negative binomial model. Variables estimating the size and geographical distribution of the banks' activities are introduced as explanatory variables. The results show that the negative binomial distribution outperforms the Poisson distribution. The scaling is done by calculating the parameters of the selected distribution from the estimated coefficients and the variables related to a given bank. Frequencies of losses of more than $1 million are generated over a specified horizon.
    Keywords: operational risk in banks, scaling, severity distribution, frequency distribution, truncated count data regression models

    Image reconstruction in fluorescence molecular tomography with sparsity-initialized maximum-likelihood expectation maximization

    We present a reconstruction method involving maximum-likelihood expectation maximization (MLEM) to model Poisson noise as applied to fluorescence molecular tomography (FMT). MLEM is initialized with the output of a sparse reconstruction-based approach, which performs truncated singular value decomposition-based preconditioning followed by the fast iterative shrinkage-thresholding algorithm (FISTA) to enforce sparsity. The motivation for this approach is that sparsity information can be accounted for in the initialization, while MLEM accurately models Poisson noise in the FMT system. Simulation experiments show that the proposed method significantly improves images both qualitatively and quantitatively. The method converges over 20 times faster than uniformly initialized MLEM and is more robust to noise than pure sparse reconstruction. We also theoretically justify the ability of the proposed approach to reduce noise in the background region compared to pure sparse reconstruction. Overall, these results provide strong evidence for modeling Poisson noise in FMT reconstruction and for applying the proposed reconstruction framework to FMT imaging.
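    The core MLEM update for a Poisson forward model y ~ Poisson(A x) is multiplicative; a small dense-matrix sketch is below. The paper's contribution is initializing x0 from a FISTA-based sparse reconstruction rather than uniformly, which this toy version does not include:

```python
import numpy as np

def mlem(A, y, x0, n_iter=200):
    """MLEM iterations for y ~ Poisson(A @ x); updates preserve nonnegativity."""
    sens = A.sum(axis=0)      # sensitivity image, A^T 1
    x = x0.astype(float).copy()
    for _ in range(n_iter):
        proj = A @ x          # expected counts under the current estimate
        x *= (A.T @ (y / proj)) / sens
    return x
```

    Each update rescales the current image by the back-projected ratio of measured to predicted counts, so a better starting image (here, a sparse reconstruction) directly shortens the path to convergence.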

    Intensity of competition and market structure in the Italian banking industry

    The aim of this paper is to test the predictions of Sutton's model of independent submarkets for the Italian retail banking industry. This industry can be viewed as made up of a large number of local markets corresponding to different geographical locations. To do so, I first develop a model of endogenous mergers showing how the number of firms is determined by the initial number of firms, the intensity of competition, and the degree of product differentiation, and how this in turn affects the one-firm concentration index. In the second part, the number of banks in each submarket is estimated using a truncated model and a Poisson model; the size of the submarkets turned out to be at most provincial. Finally, the one-firm concentration ratio of each province is regressed on the number of banks, also in interaction with market size variables. As Sutton argued for industries with exogenous sunk costs, a stronger negative relationship is found as the market becomes larger.
    Keywords: exogenous sunk costs, intensity of competition, concentration, truncated and Poisson models

    Poisson Spline Nonparametric Regression Modeling of the Infant Mortality Rate in South Sulawesi

    Poisson regression analysis is a method used to analyze the relationship between predictor variables and a response variable with a Poisson distribution. However, not all data follow a regular pattern, in which case parametric Poisson regression is not appropriate. To solve this problem, multivariable Poisson nonparametric regression with a truncated spline estimator was used. In this research, parameter estimation for multivariable Poisson nonparametric regression was applied to data on infant mortality rates in South Sulawesi in 2017. The infant mortality rate is measured from the number of infant deaths under one year of age. The optimal knot points were selected using the Generalized Cross Validation (GCV) method. The best model is a linear spline model with one knot point. Based on the estimated parameters, the number of babies with low birth weight (x1) and the number of infants who are exclusively breastfed (x3) significantly affect the number of infant deaths.
    Keywords: GCV, Multivariable Nonparametric Regression, Poisson, Spline Truncated, Total Infant Mortality
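    The "truncated" in the spline estimator refers to the truncated power basis. For the linear, one-knot model selected here, the linear predictor of the Poisson regression can be sketched as follows (a generic illustration; the coefficient values in the test are hypothetical, not the paper's estimates):

```python
import math

def truncated_linear_basis(x, knot):
    """Basis row [1, x, (x - knot)_+] for a linear spline with one knot."""
    return [1.0, x, max(0.0, x - knot)]

def poisson_spline_mean(x, beta, knot):
    """Fitted Poisson mean exp(b0 + b1*x + b2*(x - knot)_+);
    the slope of the linear predictor changes by b2 at the knot."""
    eta = sum(b * z for b, z in zip(beta, truncated_linear_basis(x, knot)))
    return math.exp(eta)
```

    The fit is continuous at the knot because the truncated term (x - knot)_+ is zero there; GCV is used to pick the knot location that best trades off fit against smoothness.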

    Modeling SAGE data with a truncated gamma-Poisson model

    BACKGROUND: Serial Analysis of Gene Expression (SAGE) produces gene expression measurements on a discrete scale, due to the finite number of molecules in the sample. This means that part of the variance in SAGE data should be understood as the sampling error of a binomial or Poisson distribution, whereas other variance sources, in particular biological variance, should be modeled using a continuous distribution, i.e. a prior on the intensity of the Poisson distribution. One challenge is that such a model predicts a large number of genes with zero counts, which cannot be observed. RESULTS: We present a hierarchical Poisson model with a gamma prior and three different algorithms for estimating the parameters of the model. It turns out that the rate parameter of the gamma distribution can be estimated from a single SAGE library, whereas the estimate of the shape parameter is unstable. This means that the number of zero counts cannot be estimated reliably. When a bivariate model is applied to two SAGE libraries, however, the predicted number of zero counts becomes more stable and in approximate agreement with the number of transcripts observed across a large number of experiments. In all the libraries we analyzed there was a small population of very highly expressed tags, typically 1% of the tags, that could not be accounted for by the model. To handle those tags we augmented the model with a non-parametric component. We also show some results based on a log-normal distribution instead of the gamma distribution. CONCLUSION: By modeling SAGE data with a hierarchical Poisson model it is possible to separate the sampling variance from the variance in gene expression. If expression levels are reported at the gene level rather than the tag level, genes mapped to multiple tags must be kept separate, since their expression levels show different statistical behavior. A log-normal prior provided a better fit to our data than the gamma prior, but except for a small subpopulation of tags with very high counts, the two priors are similar.
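    Under the gamma prior, the marginal tag-count distribution is negative binomial, and the unobservable zero class has mass (rate / (rate + 1))^shape; a minimal sketch of that marginal (illustrative of the model family only, not the paper's estimation algorithms):

```python
import math

def gamma_poisson_pmf(k, shape, rate):
    """Marginal P(K = k) for K | lam ~ Poisson(lam), lam ~ Gamma(shape, rate):
    a negative binomial with success probability p = rate / (rate + 1)."""
    p = rate / (rate + 1.0)
    log_pmf = (math.lgamma(k + shape) - math.lgamma(shape) - math.lgamma(k + 1)
               + shape * math.log(p) + k * math.log(1.0 - p))
    return math.exp(log_pmf)
```

    The predicted number of unseen zero-count tags is then n_tags * gamma_poisson_pmf(0, shape, rate), which is why an unstable shape estimate makes that prediction unreliable.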